Kullback–Leibler divergence
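
For reference, the Kullback–Leibler divergence of a distribution Q from a distribution P over a common discrete support is D_KL(P ‖ Q) = Σ_x P(x) log(P(x) / Q(x)); it is non-negative, equals zero exactly when P = Q, and is not symmetric in its arguments.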

Results: 486



#  Item
241. Information theory / Randomness / Estimation theory / Statistical inference / Entropy / Principle of maximum entropy / Kullback–Leibler divergence / Maximum likelihood / Mutual information / Statistics / Probability and statistics / Statistical theory

How biased are maximum entropy models? Jakob H. Macke, Gatsby Computational Neuroscience Unit, University College London, UK [removed]

Source URL: www.gatsby.ucl.ac.uk

Language: English - Date: 2011-11-16 10:58:04
242. Statistical theory / Thermodynamics / Detection theory / Physics / Game theory / Statistics / Kullback–Leibler divergence

Judgment and Decision Making, Vol. 7, No. 2, March 2012, pp. 119–148. Information search with situation-specific reward functions. Björn Meder, Jonathan D. Nelson

Source URL: journal.sjdm.org

Language: English - Date: 2012-03-31 13:00:11
243. Bayesian statistics / Science / Neuroscience / Cognitive psychology / Psychophysics / Kullback–Leibler divergence / Prior probability / Entailment / Bayesian inference / Statistics / Logic / Statistical theory

Mind Reading by Machine Learning: A Doubly Bayesian Method for Inferring Mental Representations – supplementary material – Ferenc Huszár ([removed]), Department of Engineering, University of Cambridge, Ca

Source URL: learning.eng.cam.ac.uk

Language: English - Date: 2010-08-25 06:18:29
244. Mathematical sciences / Kullback–Leibler divergence / Statistical theory / Thermodynamics

Prediction with Limited Advice and Multiarmed Bandits with Paid Observations. Full Version Including Appendices. Yevgeny Seldin, Queensland University of Technology and UC Berkeley

Source URL: jmlr.org

Language: English - Date: 2014-02-16 19:30:21
245. Thermodynamics / Random variable / Probability and statistics / Measurement / Statistics / Kullback–Leibler divergence / Statistical theory

Two is better than one: distinct roles for familiarity and recollection when retrieving palimpsest memories – supplementary text – Cristina Savin [removed]

Source URL: learning.eng.cam.ac.uk

Language: English - Date: 2011-10-25 12:51:22
246. Mathematical analysis / Statistical theory / Randomness / Kullback–Leibler divergence / Thermodynamics / Divergence / Mutual information / Entropy / Cross entropy / Statistics / Information theory / Mathematics

A FAMILY OF DISCRIMINATIVE TRAINING CRITERIA BASED ON THE F-DIVERGENCE FOR DEEP NEURAL NETWORKS. Markus Nussbaum-Thom, Xiaodong Cui, Ralf Schlüter, Vaibhava Goel, Hermann Ney

Source URL: www-i6.informatik.rwth-aachen.de

Language: English
247. Kullback–Leibler divergence / Thermodynamics / Weight / Principle of maximum entropy / Mean / Exponential distribution / Maximum spacing estimation / Statistics / Statistical theory / Probability and statistics

Journal of Machine Learning Research [removed] Submitted 4/00; Published [removed]. Likelihood and F-measure maximization under uncertainty. Georgi Dimitroff

Source URL: www.ontotext.com

Language: English - Date: 2014-08-19 16:38:16
248. Estimation theory / Expectation–maximization algorithm / Maximum likelihood / Partially observable Markov decision process / Parameter / Kullback–Leibler divergence / Normal distribution / Bayesian network / Dialogue / Statistics / Statistical theory / Bayesian statistics

PARAMETER LEARNING FOR POMDP SPOKEN DIALOGUE MODELS. B. Thomson, F. Jurčíček, M. Gašić, S. Keizer, F. Mairesse, K. Yu, S. Young, Cambridge University Engineering Department. ABSTRACT: The partially observable Mar

Source URL: mi.eng.cam.ac.uk

Language: English - Date: 2010-11-09 15:42:38
249. Statistical models / Regression analysis / Markov models / Mixture model / Kullback–Leibler divergence / Variational Bayesian methods / Bayesian inference / Naive Bayes classifier / Principle of maximum entropy / Statistics / Bayesian statistics / Statistical theory

Cold-start Active Learning with Robust Ordinal Matrix Factorization. Neil Houlsby, José Miguel Hernández-Lobato, Zoubin Ghahramani, University of Cambridge, Department of Engineering, Cambridge CB2 1PZ, UK

Source URL: mlg.eng.cam.ac.uk

Language: English - Date: 2014-07-12 13:27:52
250. Statistical theory / Kullback–Leibler divergence / Thermodynamics / Soundness / Mathematics / IP / Information theory / Proof theory / Logic / Science

Tight Parallel Repetition Theorems for Public-Coin Arguments using KL-divergence. Kai-Min Chung (Academia Sinica) and Rafael Pass

Source URL: eprint.iacr.org

Language: English - Date: 2015-01-13 19:40:20